Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities
Authors
Abstract
Sufficient dimension reduction (SDR) aims to obtain a low-rank projection matrix in the input space such that information about the output data is maximally preserved. Among various approaches to SDR, a promising method is based on the eigendecomposition of the outer product of the gradient of the conditional density of output given input. In this letter, we propose a novel estimator of the gradient of the logarithmic conditional density that directly fits a linear-in-parameter model to the true gradient under the squared loss. Thanks to this simple least-squares formulation, its solution can be computed efficiently in closed form. We then develop a new SDR method based on the proposed gradient estimator. We theoretically prove that the proposed gradient estimator, as well as the SDR solution obtained from it, achieves the optimal parametric convergence rate. Finally, we experimentally demonstrate that our SDR method compares favorably with existing approaches in both accuracy and computational efficiency on a variety of artificial and benchmark data sets.
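To make the recipe concrete, below is a minimal NumPy sketch of the same pipeline, not the authors' exact estimator: it uses the identity ∇x log p(y|x) = ∇x log p(x, y) − ∇x log p(x), fits each log-density gradient with a linear-in-parameter Gaussian-basis model whose squared-loss solution is available in closed form via integration by parts, and then eigendecomposes the averaged outer product of the estimated gradients. The function names, the basis choice, and the fixed bandwidth and regularizer are illustrative assumptions; in practice such hyperparameters would be tuned, for example by cross-validation.

```python
# Illustrative sketch only: a least-squares, closed-form estimator of
# log-density gradients with Gaussian bases, used to recover an SDR subspace
# from the outer product of estimated gradients of log p(y|x). Names and
# hyperparameter values are assumptions, not the paper's exact setup.
import numpy as np

def fit_log_density_gradient(Z, centers, sigma=1.0, lam=1e-3, dims=None):
    """Fit d/dz_j log p(z) for each j in `dims` with the model
    g_j(z) = theta_j' psi(z), psi_b(z) = exp(-||z - c_b||^2 / (2 sigma^2)).
    Integration by parts makes the squared loss quadratic in theta_j,
    so the solution is closed form."""
    n, D = Z.shape
    dims = list(range(D)) if dims is None else list(dims)
    diff = Z[:, None, :] - centers[None, :, :]                # (n, B, D)
    Psi = np.exp(-np.sum(diff**2, axis=2) / (2 * sigma**2))   # (n, B)
    G = Psi.T @ Psi / n                                       # E[psi psi']
    thetas = {}
    for j in dims:
        # h_j = E[d psi / d z_j] replaces the unknown true gradient.
        h = (-diff[:, :, j] / sigma**2 * Psi).mean(axis=0)
        thetas[j] = -np.linalg.solve(G + lam * np.eye(len(centers)), h)
    def grad(Znew):
        d2 = np.sum((Znew[:, None, :] - centers[None, :, :])**2, axis=2)
        P = np.exp(-d2 / (2 * sigma**2))
        return np.column_stack([P @ thetas[j] for j in dims])
    return grad

def sdr_subspace(X, Y, d, n_centers=100, sigma=1.0, lam=1e-3, seed=0):
    """Top-d eigenvectors of the averaged outer product of the estimated
    grad_x log p(y|x) = grad_x log p(x, y) - grad_x log p(x)."""
    rng = np.random.default_rng(seed)
    n, dx = X.shape
    Z = np.hstack([X, Y])                                     # joint samples
    idx = rng.choice(n, size=min(n_centers, n), replace=False)
    g_joint = fit_log_density_gradient(Z, Z[idx], sigma, lam, dims=range(dx))
    g_marg = fit_log_density_gradient(X, X[idx], sigma, lam, dims=range(dx))
    Gx = g_joint(Z) - g_marg(X)                               # (n, dx)
    _, eigvec = np.linalg.eigh(Gx.T @ Gx / n)
    return eigvec[:, ::-1][:, :d]

# Toy usage: y depends on x only through its first coordinate.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
Y = np.sin(2 * X[:, :1]) + 0.1 * rng.normal(size=(500, 1))
print(np.round(sdr_subspace(X, Y, d=1).ravel(), 2))
```

The recovered basis vector should load mainly on the first coordinate; with more data or tuned hyperparameters the alignment sharpens, which is the behavior the parametric convergence guarantee in the letter concerns.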
Similar Resources
On model-free conditional coordinate tests for regressions
Existing model-free tests of the conditional coordinate hypothesis in sufficient dimension reduction (Cook (1998) [3]) have focused mainly on first-order estimation methods such as sliced inverse regression (Li (1991) [14]). Such testing procedures, based on quadratic inference functions, are difficult to extend to second-order sufficient dimension reduction methods such as the...
Likelihood-based Sufficient Dimension Reduction
We obtain the maximum likelihood estimator of the central subspace under conditional normality of the predictors given the response. Analytically and in simulations, we found that our new estimator can perform much better than sliced inverse regression, sliced average variance estimation, and directional regression, and that it seems quite robust to deviations from normality.
Sufficient dimension reduction for the conditional mean with a categorical predictor in multivariate regression
Recent sufficient dimension reduction methodologies in multivariate regression do not have direct application to a categorical predictor. For this, we define the multivariate central partial mean subspace and propose two methodologies to estimate it. The first method uses ordinary least squares. Chi-squared distributed statistics for dimension tests are constructed, and an estimate of the t...
Penalized Bregman Divergence Estimation via Coordinate Descent
Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron et al. (2004) introduced the LARS algorithm. More recently, Friedman et al. (2007) developed the coordinate descent (CD) algorithm for penalized linear and penalized logistic regression and showed it to be computationally superior. This paper explores...
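As a concrete illustration of the coordinate descent idea mentioned in this snippet, here is a minimal sketch of cyclic CD for the lasso (squared loss plus an L1 penalty); the objective scaling, iteration count, and toy data are illustrative assumptions rather than details from the cited papers.

```python
# Minimal sketch of cyclic coordinate descent for the lasso; each coordinate
# update is a one-dimensional problem solved exactly by soft-thresholding.
import numpy as np

def soft_threshold(z, gamma):
    """Closed-form solution of the 1-D lasso subproblem."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_cd(X, y, lam, n_iters=200):
    """Minimize (1/2n) ||y - X beta||^2 + lam * ||beta||_1 by cycling
    through the coordinates of beta."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X**2).sum(axis=0) / n        # per-coordinate curvature
    resid = y - X @ beta
    for _ in range(n_iters):
        for j in range(p):
            # Partial residual: add back coordinate j's own contribution.
            rho = X[:, j] @ (resid + X[:, j] * beta[j]) / n
            new_bj = soft_threshold(rho, lam) / col_sq[j]
            resid += X[:, j] * (beta[j] - new_bj)  # keep residual in sync
            beta[j] = new_bj
    return beta

# Toy usage: a sparse coefficient vector recovered from noisy data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
beta_true = np.zeros(10)
beta_true[:3] = [2.0, -1.0, 0.5]
y = X @ beta_true + 0.1 * rng.normal(size=200)
print(np.round(lasso_cd(X, y, lam=0.05), 2))
```

Maintaining the residual incrementally keeps each coordinate update O(n), which is the source of CD's computational advantage over repeatedly refitting the full model.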
Sufficient Dimension Reduction via Inverse Regression: A Minimum Discrepancy Approach
A family of dimension-reduction methods, the inverse regression (IR) family, is developed by minimizing a quadratic objective function. An optimal member of this family, the inverse regression estimator (IRE), is proposed, along with inference methods and a computational algorithm. The IRE has at least three desirable properties: (1) Its estimated basis of the central dimension reduction subspa...
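For a runnable reference point, below is a minimal sketch of sliced inverse regression (SIR), the canonical first-order member of the inverse-regression family this snippet discusses; the IRE's minimum-discrepancy refinement is not reproduced here, and the slice count and toy data are illustrative assumptions.

```python
# Minimal sketch of sliced inverse regression (SIR): eigendecompose the
# between-slice covariance of E[X | y] after standardizing X.
import numpy as np

def sir(X, y, d, n_slices=10):
    n, p = X.shape
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    # Standardize so the slice means live in an identity-covariance scale.
    L = np.linalg.cholesky(np.linalg.inv(cov))   # L L' = cov^{-1}
    Z = (X - mu) @ L
    slices = np.array_split(np.argsort(np.ravel(y)), n_slices)
    M = np.zeros((p, p))
    for s in slices:
        m = Z[s].mean(axis=0)
        M += (len(s) / n) * np.outer(m, m)       # weighted slice means
    _, eigvec = np.linalg.eigh(M)
    B = L @ eigvec[:, ::-1][:, :d]               # back to the original scale
    return B / np.linalg.norm(B, axis=0)

# Toy usage: y depends on X through the single direction (1, 0.5, 0, 0, 0).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=1000)
print(np.round(sir(X, y, d=1).ravel(), 2))
```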
Journal: Neural Computation
Volume: 30, Issue: 2
Pages: -
Publication date: 2015